Neural network models based on the attention mechanism are widely used in aspect-level sentiment analysis. However, such models ignore both the dependencies between aspect words and opinion words and the distances between aspect words and context words, which leads to inaccurate sentiment classification. To address these problems, a Relational and Interactive Graph ATtention network (RI-GAT) model was established. Firstly, the semantic features of sentences were learned by a Long Short-Term Memory (LSTM) network. Then, the learned semantic features were combined with the position information of sentences to generate new features. Finally, the dependencies between aspect words and opinion words were extracted from the new features, enabling efficient and comprehensive use of syntactic dependency information and position information. Experimental results on the Laptop, Restaurant, and Twitter datasets show that, compared with the suboptimal Dynamic Multi-channel Graph Convolutional Network (DM-GCN), the RI-GAT model improves classification Accuracy (Acc) by 0.67, 1.65, and 1.36 percentage points respectively, indicating that RI-GAT can better establish the relationship between aspect words and opinion words, making sentiment classification more accurate.
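The pipeline summarized above (LSTM-derived semantic features, fusion with position information, and attention restricted to syntactic dependency edges) can be sketched roughly as follows. This is a minimal illustrative sketch only: the linear position-decay weighting, the dot-product attention form, and the toy dependency graph are assumptions for illustration, not the paper's exact RI-GAT formulation.

```python
import numpy as np

def position_weight(n, aspect_idx):
    # Assumed scheme: linear decay with distance from the aspect word.
    # The abstract states that position information is fused with the
    # semantic features, but not the exact weighting function.
    return 1.0 - np.abs(np.arange(n) - aspect_idx) / n

def graph_attention(H, adj):
    # Single-head graph attention restricted to dependency edges:
    # dot-product scores, masked by the adjacency matrix, then softmaxed,
    # so each word attends only to its syntactic neighbours.
    scores = H @ H.T
    scores = np.where(adj > 0, scores, -1e9)   # mask non-edges
    scores = scores - scores.max(axis=1, keepdims=True)
    attn = np.exp(scores)
    attn = attn / attn.sum(axis=1, keepdims=True)
    return attn @ H

n, d = 5, 8
rng = np.random.default_rng(0)
H = rng.standard_normal((n, d))                     # stand-in for LSTM hidden states
H = H * position_weight(n, aspect_idx=2)[:, None]   # fuse position information
adj = np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)  # toy dependency graph
out = graph_attention(H, adj)
print(out.shape)  # (5, 8): one position-aware, dependency-attended vector per word
```

In a full model the adjacency matrix would come from a dependency parser and the attended features would feed a softmax classifier over sentiment polarities; here the graph is hand-built solely to show the masking step.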